On the convergence to equilibrium of Kac’s random walk on matrices
Abstract
We consider Kac’s random walk on n-dimensional rotation matrices, where each step is a random rotation in the plane generated by two randomly picked coordinates. We show that this process converges to the uniform (Haar) measure in the (Wasserstein) transportation cost metric in O(n² ln n) steps. This improves on previous results of Diaconis/Saloff-Coste and Pak/Sidenko and is a ln n factor away from being optimal. Our proof method includes a general result akin to the path coupling method of Bubley and Dyer. Suppose that P is a Markov chain on a Polish length space (M, d) and that, for all x, y ∈ M with d(x, y) ≪ 1, there is a coupling (X, Y) of one step of P from x and from y (respectively) that is (c + o(1))-contracting on average. Then the map μ ↦ μP is c-contracting in the transportation cost metric. Other applications of this result are also presented.
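As a rough illustration of the walk just described (a minimal sketch under the definition above, not code from the paper), each step is a random Givens rotation: pick two coordinates uniformly at random and rotate the plane they span by a uniform angle. A NumPy version, with the loop count chosen only to mirror the O(n² ln n) order of the bound above, might look like this:

    import numpy as np

    def kac_step(X, rng):
        # One step of Kac's walk on SO(n): rotate the plane of two uniformly
        # chosen coordinates by an angle that is uniform on [0, 2*pi).
        n = X.shape[0]
        i, j = rng.choice(n, size=2, replace=False)
        theta = rng.uniform(0.0, 2.0 * np.pi)
        c, s = np.cos(theta), np.sin(theta)
        R = np.eye(n)
        R[i, i], R[i, j] = c, -s
        R[j, i], R[j, j] = s, c
        return R @ X

    rng = np.random.default_rng(0)
    n = 5
    X = np.eye(n)                                  # start at the identity rotation
    for _ in range(int(n * n * np.log(n)) + 1):    # ~ n^2 ln n steps, the order of the bound above
        X = kac_step(X, rng)
    assert np.allclose(X @ X.T, np.eye(n))         # each step preserves orthogonality

Measuring closeness to Haar measure in the transportation cost metric is, of course, the substance of the result and is not attempted in this sketch.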
Similar resources
Convergence of Kac’s Random Walk
We study a long-standing open problem on the mixing time of Kac’s random walk on SO(n,R) by random rotations. We obtain an upper bound t_mix = O(n log n) for the weak convergence which is close to the trivial lower bound Ω(n). This improves the upper bound O(n log n) by Diaconis and Saloff-Coste [9]. The proof is a variation on the coupling technique we develop to bound the mixing time for compac...
On explicit occupation time distributions for Brownian processes
Explicit expressions for the occupation time distribution of Brownian bridge, excursion and meander are derived using Kac’s formula and results of [4]. The first two distributions appeared in [8], and were derived using weak convergence of simple random walk.
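A classical instance of such an explicit expression (recalled here for illustration; not necessarily the form derived in the cited work): for a standard Brownian bridge (b_t)_{0 ≤ t ≤ 1}, the occupation time of the positive half-line, ∫₀¹ 1{b_t > 0} dt, is uniformly distributed on [0, 1], whereas for standard Brownian motion the same functional obeys Lévy’s arcsine law, P(∫₀¹ 1{B_t > 0} dt ≤ x) = (2/π) arcsin √x for 0 ≤ x ≤ 1.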
Total Variation Bound for Kac’s Random Walk
We show that the classical Kac’s random walk on Sⁿ⁻¹ starting from the point mass at e₁ mixes in O(n log n) steps in total variation distance. This improves a previous bound by Diaconis and Saloff-Coste of O(n).
Compressing Deep Neural Networks: A New Hashing Pipeline Using Kac's Random Walk Matrices
The popularity of deep learning is increasing by the day. However, despite the recent advancements in hardware, deep neural networks remain computationally intensive. Recent work has shown that by preserving the angular distance between vectors, random feature maps are able to reduce dimensionality without introducing bias to the estimator. We test a variety of established hashing pipelines as ...
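For orientation only, here is a generic sketch of the kind of angle-preserving hashing step alluded to above (sign random projections; this is not the Kac's-random-walk pipeline of the paper, and all sizes are illustrative): the fraction of disagreeing bits between two hashes is an unbiased estimator of the angle between the vectors divided by π.

    import numpy as np

    rng = np.random.default_rng(1)
    d, k = 256, 4096                     # original and hashed dimensions (illustrative)
    x, y = rng.normal(size=d), rng.normal(size=d)

    # Sign random projections: one bit per random Gaussian hyperplane.
    G = rng.normal(size=(k, d))
    hx, hy = np.sign(G @ x), np.sign(G @ y)

    # Pr[bits disagree] = angle(x, y) / pi, so the disagreement rate estimates the angle.
    est_angle = np.pi * np.mean(hx != hy)
    true_angle = np.arccos(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))
    print(est_angle, true_angle)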
Local Perturbations of Energy and Kac’s Return Time Theorem
Abstract. We introduce the notion of local perturbations for normalized energies and study their effect on the level of equilibrium measures. Using coupling techniques and Kac’s return time theorem, we obtain some d̄-estimates for the equilibrium measures. These reveal stability of certain energies under local perturbations. They also show how some weak-⋆ convergence of equilibrium may be obtaine...
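For reference, Kac’s return time theorem (Kac’s lemma), invoked above, states that for an ergodic measure-preserving transformation T of a probability space (X, μ) and a set A with μ(A) > 0, the expected first return time to A satisfies ∫_A τ_A dμ = 1, equivalently E_{μ(·|A)}[τ_A] = 1/μ(A), where τ_A(x) = min{n ≥ 1 : Tⁿx ∈ A}.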
Publication date: 2009